
    Design Effects in the Transition to Web-Based Surveys

    Innovation within survey modes should always be tempered by concerns about survey quality and, in particular, sampling, coverage, nonresponse, and measurement error. This is as true today with the development of web surveying as it was in the 1970s when telephone surveying was being developed. This paper focuses on measurement error in web surveys. Although Internet technology provides significant opportunities for innovation in survey design, systematic research has yet to be conducted on how most of the possible innovations might affect measurement error, leaving many survey designers “out in the cold.” This paper summarizes recent research to provide an overview of how choosing the web mode affects the asking and answering of questions. It starts with examples of how question formats used in other survey modes perform differently in the web mode. It then provides examples of how the visual design of web surveys can influence answers in unexpected ways and how researchers can strategically use visual design to get respondents to provide their answers in a desired format. Finally, the paper concludes with suggested guidelines for web survey design.

    The Role of Email Communications in Determining Response Rates and Mode of Participation in a Mixed-mode Design

    This article is concerned with the extent to which the propensity to participate in a web-face-to-face sequential mixed-mode survey is influenced by the ability to communicate with sample members by email in addition to mail. Researchers may be able to collect email addresses for sample members and to use them subsequently to send survey invitations and reminders. However, there is little evidence regarding the value of doing so. This makes it difficult to decide what efforts should be made to collect such information and how to use it efficiently thereafter. Using evidence from a randomized experiment within a large mixed-mode national survey, we find that using a respondent-supplied email address to send additional survey invitations and reminders does not affect the survey response rate but is associated with an increased proportion of responses by web rather than face to face and, hence, with lower survey costs.

    The effectiveness of crowdsourcing in knowledge-based industries: the moderating role of transformational leadership and organisational learning

    Crowdsourcing provides an opportunity for SMEs to exploit collective knowledge that is located outside the organisation. Crowdsourcing allows organisations to keep pace with a fast-changing environment by solving business problems, supporting R&D activities, and fostering innovation cheaply, flexibly, and dynamically. Nevertheless, managing crowdsourcing is difficult, and positive outcomes are not guaranteed. Drawing on the Resource-based View, we study transformational leadership and organisational learning capability as complementary assets that help SMEs deploy crowdsourcing. An empirical study of Spanish telecommunications and biotechnology companies confirmed the moderating effect of organisational learning on the relationship between crowdsourcing and organisational performance. Devece Carañana, C. A., Palacios Marqués, D., & Ribeiro-Navarrete, B. (2019). The effectiveness of crowdsourcing in knowledge-based industries: the moderating role of transformational leadership and organisational learning. Economic Research-Ekonomska Istraživanja, 32(1), 335-351. https://doi.org/10.1080/1331677X.2018.1547204

    Open-Ended Questions in Web Surveys: Can Increasing the Size of Answer Boxes and Providing Extra Verbal Instructions Improve Response Quality?

    Previous research has revealed techniques to improve response quality in open-ended questions in both paper and interviewer-administered survey modes. The purpose of this paper is to test the effectiveness of similar techniques in web surveys. Using data from a series of three random sample web surveys of Washington State University undergraduates, we examine the effects of visual and verbal answer-box manipulations (i.e., altering the size of the answer box and including an explanation that answers could exceed the size of the box) and the inclusion of clarifying and motivating introductions in the question stem. We gauge response quality by the amount and type of information contained in responses as well as by response time and item nonresponse. The results indicate that increasing the size of the answer box had little effect on early responders to the survey but substantially improved response quality among late responders. Including any sort of explanation or introduction that made response quality and length salient also improved response quality for both early and late responders. In addition to discussing these techniques, we also address the potential of the web survey mode to revitalize the use of open-ended questions in self-administered surveys.

    Comparing Check-All and Forced-Choice Question Formats in Web Surveys

    For survey researchers, it is common practice to use the check-all question format in Web and mail surveys but to convert to the forced-choice question format in telephone surveys. The assumption underlying this practice is that respondents will answer the two formats similarly. In this research note we report results from 16 experimental comparisons in two Web surveys and a paper survey conducted in 2002 and 2003 that test whether the check-all and forced-choice formats produce similar results. In all 16 comparisons, we find that the two question formats do not perform similarly; respondents endorse more options and take longer to answer in the forced-choice format than in the check-all format. These findings suggest that the forced-choice question format encourages deeper processing of response options and, as such, is preferable to the check-all format, which may encourage a weak satisficing response strategy. Additional analyses show that neither acquiescence bias nor item nonresponse seems to pose a substantial problem for the use of the forced-choice question format in Web surveys.

    The uptake of different tillage practices in England

    Reduced tillage systems have been argued to provide several potential benefits to soil, the environment, and farm incomes. In England, while many farms have partially adopted such practices, a large proportion of arable farmers do not undertake reduced tillage in any form. This paper analyses the rationale for and uptake of different cultivation techniques, including analysis of the barriers to adoption of reduced tillage, aiming to benefit policy makers and researchers and increase the spread of smart agricultural practices. Based on a postal questionnaire, we estimated that 47.6% of English arable land is cultivated using minimum tillage and 7% under no-tillage. As farm size increased, so did the probability of reduced tillage uptake. Furthermore, farms growing combinable crops were more likely to utilise reduced tillage approaches than other farm types. Soil type, weed control and weather conditions were noted as the main drivers for ‘strategic’ and ‘rotational’ ploughing, constraining continuous reduced tillage use. To effect greater reduced tillage uptake, greater communication between researchers and farmers is needed to facilitate the implementation of sustainable soil management solutions, supported by current legislation permitting responsible herbicide use in arable production. Financial support to access reduced tillage machinery may also be required for farmers operating smaller holdings. Adopting reduced tillage is a continuous learning process requiring ongoing training and information gathering; supporting a network of reduced tillage ‘farmer champions’ would facilitate practical knowledge exchange, allow farmers to observe soil improvements, understand transition phase barriers, and ultimately encourage increased reduced tillage uptake.

    Impact of Vehicle Flexibility on IRVE-II Flight Dynamics

    The Inflatable Re-entry Vehicle Experiment II (IRVE-II) successfully launched from Wallops Flight Facility (WFF) on August 17, 2009. The primary objectives of this flight test were to demonstrate inflation and re-entry survivability, assess the thermal and drag performance of the re-entry vehicle, and collect flight data for refining pre-flight design and analysis tools. Post-flight analysis, including the trajectory reconstruction outlined in O'Keefe [3], demonstrated that the IRVE-II Research Vehicle (RV) met mission objectives but also identified a few anomalies of interest to flight dynamics engineers. Most notable of these anomalies was high normal acceleration during the re-entry pressure pulse. Deflection of the inflatable aeroshell during the pressure pulse was evident in flight video and identified as the likely cause of the anomaly. This paper provides a summary of further post-flight analysis, with particular attention to the impact of aeroshell flexibility on flight dynamics and the reconciliation of flight performance with pre-flight models. Independent methods for estimating the magnitude of the aeroshell deflection experienced on IRVE-II are discussed. The use of the results to refine models for pre-flight prediction of vehicle performance is then described.

    So you call that research? Mending methodological biases in strategy and organization departments of top business schools

    We believe that all strategy and organization (SO) scholars should be able to decide for themselves whether to specialize in certain parts of the knowledge cycle or adopt a broader, multi-method view of the scientific process. In a situation of “methodological pluralism”, individuals might choose to contribute to the construction of new administrative theories by means of qualitative works such as case studies, ethnographies, biographies, or grounded theory studies (e.g., see Denzin and Lincoln, 2000). Others could then specialize in testing these theories by means of experiments, surveys, or longitudinal econometric studies (e.g., see Lewis-Beck, 1987-2004). Still others could combine both approaches in Herculean attempts to conduct high-impact, integrative research with the potential to change the way we understand the field as a whole.

    Awareness and use of biodiversity collections by fish biologists

    A survey of 280 fish biologists from a diverse pool of disciplines was conducted in order to assess the use made of biodiversity collections and how collections can better collect, curate, and share the data they hold. From the responses, data on how fish biologists use collections, which data they find most useful, what factors influence their decisions to use collections, how they access the data, and why some fish biologists decide not to use biodiversity collections are collated and reported. These results could be used to formulate sustainability plans for administrators and staff who curate fish biodiversity collections, while also highlighting to researchers the diversity of data and their uses.